Variance-based Regularization with Convex Objectives

Abstract

We develop an approach to risk minimization and stochastic optimization that provides a convex surrogate for variance, allowing near-optimal and computationally efficient trading between approximation and estimation error. Our approach builds off of techniques for distributionally robust optimization and Owen's empirical likelihood, and we provide a number of finite-sample and asymptotic results characterizing the theoretical performance of the estimator. In particular, we show that our procedure comes with certificates of optimality, achieving (in some scenarios) faster rates of convergence than empirical risk minimization by virtue of automatically balancing bias and variance. We give corroborating empirical evidence showing that in practice, the estimator indeed trades between variance and absolute performance on a training sample, improving out-of-sample (test) performance over standard empirical risk minimization for a number of classification problems.
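To make the surrogate concrete, here is a sketch of the distributionally robust objective the abstract refers to, assuming (as in the standard formulation of this construction) a chi-squared-divergence ball of radius ρ/n around the empirical distribution P̂_n over the n training points:

\[
\sup_{P \,:\, D_{\chi^2}(P \,\|\, \hat{P}_n) \le \rho/n} \mathbb{E}_P\big[\ell(\theta; X)\big]
\;\approx\; \mathbb{E}_{\hat{P}_n}\big[\ell(\theta; X)\big]
\;+\; \sqrt{\frac{2\rho}{n}\,\mathrm{Var}_{\hat{P}_n}\!\big(\ell(\theta; X)\big)} ,
\]

with the approximation tight whenever the empirical variance of the loss is not too small. The left-hand side is convex in θ whenever the loss ℓ(·; X) is convex, whereas directly penalizing the square root of the variance is generally not; this is the sense in which the robust objective acts as a convex surrogate for variance regularization.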


Related Articles


Stochastic Variance-Reduced Cubic Regularized Newton Method

We propose a stochastic variance-reduced cubic regularized Newton method for non-convex optimization. At the core of our algorithm is a novel semi-stochastic gradient along with a semi-stochastic Hessian, which are specifically designed for the cubic regularization method. We show that our algorithm is guaranteed to converge to an (ε, √ε)-approximate local minimum within Õ(n^{4/5}/ε^{3/2}) second-order oracle...
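For orientation, the cubic-regularized Newton step referred to above replaces the exact gradient and Hessian in Nesterov and Polyak's subproblem with semi-stochastic estimates v_k and U_k (notation ours, not the paper's):

\[
x_{k+1} \;=\; \arg\min_{y}\;\; v_k^\top (y - x_k)
\;+\; \tfrac{1}{2}\,(y - x_k)^\top U_k\,(y - x_k)
\;+\; \tfrac{M}{6}\,\|y - x_k\|_2^3 ,
\]

where M bounds the Lipschitz constant of the Hessian; the cubic term makes each step well defined even when U_k is indefinite.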


On Second-order Properties of the Moreau-Yosida Regularization for Constrained Nonsmooth Convex Programs

In this paper, we attempt to investigate a class of constrained nonsmooth convex optimization problems, that is, piecewise C^2 convex objectives with smooth convex inequality constraints. By using the Moreau-Yosida regularization, we convert these problems into unconstrained smooth convex programs. Then, we investigate the second-order properties of the Moreau-Yosida regularization η. By introdu...
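For reference, the Moreau-Yosida regularization of a convex function f with parameter λ > 0 is

\[
\eta(x) \;=\; \min_{y}\;\Big\{\, f(y) \;+\; \tfrac{1}{2\lambda}\,\|y - x\|_2^2 \,\Big\},
\]

which is convex and continuously differentiable even when f is nonsmooth, with gradient ∇η(x) = (x − p(x))/λ, where p(x) is the minimizing y (the proximal point of x).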


Improved Natural Language Learning via Variance-Regularization Support Vector Machines

We present a simple technique for learning better SVMs using fewer training examples. Rather than using the standard SVM regularization, we regularize toward low weight-variance. Our new SVM objective remains a convex quadratic function of the weights, and is therefore computationally no harder to optimize than a standard SVM. Variance regularization is shown to enable dramatic improvements in ...
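A plausible form of such an objective (our sketch; the paper's exact regularizer may differ) swaps the usual ‖w‖² penalty for the empirical variance of the weight coordinates, which is still a convex quadratic in w:

\[
\min_{w,\,b}\;\; \sum_{j=1}^{d} \big(w_j - \bar{w}\big)^2
\;+\; C \sum_{i=1}^{n} \max\big\{0,\; 1 - y_i\,(w^\top x_i + b)\big\},
\qquad \bar{w} \;=\; \frac{1}{d}\sum_{j=1}^{d} w_j ,
\]

so the optimization remains a convex quadratic program solvable with standard SVM machinery.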


High order structural image decomposition by using non-linear and non-convex regularizing objectives

The paper addresses structural decomposition of images by using a family of non-linear and non-convex objective functions. These functions rely on ℓ_p quasi-norm estimation costs in a piecewise constant regularization framework. These objectives make image decomposition into constant cartoon levels and rich textural patterns possible. The paper shows that these regularizing objectives yield imag...
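A minimal instance of this kind of objective (notation ours; the paper's multi-level decomposition is richer): a piecewise-constant cartoon component u of an image f can be sought as

\[
\min_{u}\;\; \|f - u\|_2^2 \;+\; \lambda \sum_{i} \big|(\nabla u)_i\big|^{p},
\qquad 0 < p < 1 ,
\]

where the non-convex ℓ_p quasi-norm (p < 1) promotes sparser gradients, and hence flatter constant regions, than the convex p = 1 (total variation) case; the textural component is then recovered from the residual f − u.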



Publication year: 2017